A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima

Authors

Abstract



Similar articles

Finding Approximate Local Minima for Nonconvex Optimization in Linear Time

We design a non-convex second-order optimization algorithm that is guaranteed to return an approximate local minimum in time which is linear in the input representation. The previously fastest methods run in time proportional to matrix inversion or worse. The time complexity of our algorithm to find a local minimum is even faster than that of gradient descent to find a critical point (which can...

Full text

Global Convergence of Splitting Methods for Nonconvex Composite Optimization

We consider the problem of minimizing the sum of a smooth function h with a bounded Hessian, and a nonsmooth function. We assume that the latter function is a composition of a proper closed function P and a surjective linear map M, with the proximal mappings of τP, τ > 0, simple to compute. This problem is nonconvex in general and encompasses many important applications in engineering and mach...

Full text
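For readers unfamiliar with the ingredients above, the forward-backward (proximal-gradient) step alternates a gradient step on the smooth part with a proximal step on the nonsmooth part. The Python sketch below only illustrates that building block, not the splitting method of the cited paper: it assumes the simplest case M = I and P = λ‖·‖₁, whose proximal mapping is componentwise soft-thresholding, and the helper names (prox_l1, forward_backward_step) are placeholders.

import numpy as np

def prox_l1(v, tau):
    """Proximal mapping of tau * ||.||_1: componentwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def forward_backward_step(x, grad_h, lam, gamma):
    """One proximal-gradient step on h(x) + lam * ||x||_1 with step size gamma."""
    return prox_l1(x - gamma * grad_h(x), gamma * lam)

# Toy usage on a LASSO-type objective 0.5 * ||A x - b||^2 + lam * ||x||_1.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
grad_h = lambda x: A.T @ (A @ x - b)
gamma = 1.0 / np.linalg.norm(A.T @ A, 2)   # step size 1/L, L = Lipschitz constant of grad h
x = np.zeros(5)
for _ in range(200):
    x = forward_backward_step(x, grad_h, lam=0.1, gamma=gamma)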

Almost sure convergence of the forward-backward-forward splitting algorithm

In this paper, we propose a stochastic forward–backward–forward splitting algorithm and prove its almost sure weak convergence in real separable Hilbert spaces. Applications to composite monotone inclusion and minimization problems are demonstrated.

Full text
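For context, the deterministic forward-backward-forward (Tseng) iteration adds an extra forward correction after a forward-backward step, which is what allows a merely Lipschitz-continuous forward operator. The sketch below is a minimal finite-dimensional illustration for minimizing f + g with a known prox of g; the stochastic, Hilbert-space analysis of the cited paper is not reproduced, and the names fbf_step, grad_f, prox_g are placeholders.

import numpy as np

def fbf_step(x, grad_f, prox_g, gamma):
    """One forward-backward-forward (Tseng) step for minimizing f + g.

    grad_f : callable returning the gradient of the smooth part f
    prox_g : callable (v, gamma) -> prox_{gamma * g}(v)
    gamma  : step size, typically below 1/L for L-Lipschitz grad_f
    """
    y = prox_g(x - gamma * grad_f(x), gamma)     # forward-backward step
    return y + gamma * (grad_f(x) - grad_f(y))   # extra forward correction

# Toy usage: f(x) = 0.5 * ||x - a||^2, g = indicator of the nonnegative orthant.
a = np.array([1.0, -2.0, 0.5])
grad_f = lambda x: x - a
prox_g = lambda v, gamma: np.maximum(v, 0.0)     # projection onto x >= 0
x = np.zeros(3)
for _ in range(100):
    x = fbf_step(x, grad_f, prox_g, gamma=0.5)
# x approaches the projection of a onto the nonnegative orthant: [1.0, 0.0, 0.5]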

Nonconvex optimization using negative curvature within a modified linesearch

This paper describes a new algorithm for the solution of nonconvex unconstrained optimization problems, with the property of converging to points satisfying second-order necessary optimality conditions. The algorithm is based on a procedure which, from two descent directions, a Newton-type direction and a direction of negative curvature, selects in each iteration the linesearch model best adapt...

Full text
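To make "direction of negative curvature" concrete: when the Hessian at an iterate has a negative eigenvalue, the corresponding eigenvector, signed so that it does not ascend along the gradient, is a descent direction that can be paired with a Newton-type direction. The sketch below only shows how such directions can be computed; the selection rule and linesearch model of the cited paper are not reproduced, and the function names are placeholders.

import numpy as np

def negative_curvature_direction(grad, hess, tol=1e-8):
    """Return a direction d with d^T H d < 0, oriented so grad^T d <= 0, or None."""
    eigvals, eigvecs = np.linalg.eigh(hess)
    if eigvals[0] >= -tol:            # Hessian is (numerically) positive semidefinite
        return None
    d = eigvecs[:, 0]                 # eigenvector of the most negative eigenvalue
    return -d if grad @ d > 0 else d

def newton_type_direction(grad, hess, shift=1e-6):
    """Shifted Newton direction: solve (H + shift * I) d = -grad."""
    n = grad.shape[0]
    return np.linalg.solve(hess + shift * np.eye(n), -grad)

# Toy usage at the saddle point of f(x, y) = x^2 - y^2, where the gradient vanishes
# but the negative-curvature direction (0, ±1) still gives descent.
grad = np.array([0.0, 0.0])
hess = np.array([[2.0, 0.0], [0.0, -2.0]])
print(negative_curvature_direction(grad, hess))  # e.g. [0., 1.] or [0., -1.]
print(newton_type_direction(grad, hess))         # [0., 0.] at the saddle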

Local Convergence of Sequential Convex Programming for Nonconvex Optimization

Consider the nonconvex optimization problem (P) of minimizing c^T x over x ∈ Ω subject to g(x) = 0, where c ∈ R^n, g : R^n → R^m is non-linear and smooth on its domain, and Ω is a nonempty closed convex subset of R^n. This paper introduces sequential convex programming (SCP), a local optimization method for solving the nonconvex problem (P). We prove that, under acceptable assumptions, the SCP method locally converges to a KKT point of (P) and the rate of convergence is linear. Problems in the form of ...

Full text
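For orientation, one generic SCP iteration for a problem of this form convexifies the constraint by linearizing g at the current iterate x_k and solves the resulting convex subproblem over Ω. This is a sketch of the standard scheme under the problem statement quoted above; the exact subproblem and assumptions used in the cited paper may differ:

    x_{k+1} ∈ argmin { c^T x : x ∈ Ω, g(x_k) + g'(x_k)(x − x_k) = 0 }

Each subproblem is convex because the objective is linear, Ω is convex, and the linearized constraint is affine.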


Journal

Journal title: SIAM Journal on Optimization

Year: 2021

ISSN: 1052-6234 (print), 1095-7189 (electronic)

DOI: https://doi.org/10.1137/19m1264783